
A Bootstrap Lasso + Partial Ridge Method to Construct Confidence Intervals for Parameters in High-dimensional Sparse Linear Models



Abstract

For high-dimensional sparse linear models, how to construct confidence intervals for coefficients remains a difficult question. The main reason is the complicated limiting distributions of common estimators such as the Lasso. Several confidence interval construction methods have been developed, and Bootstrap Lasso+OLS is notable for its simple technicality, good interpretability, and comparable performance with other more complicated methods. However, Bootstrap Lasso+OLS depends on the beta-min assumption, a theoretical criterion that is often violated in practice. In this paper, we introduce a new method called Bootstrap Lasso+Partial Ridge (LPR) to relax this assumption. LPR is a two-stage estimator: first using Lasso to select features and subsequently using Partial Ridge to refit the coefficients. Simulation results show that Bootstrap LPR outperforms Bootstrap Lasso+OLS when there exist small but non-zero coefficients, a common situation violating the beta-min assumption. For such coefficients, compared to Bootstrap Lasso+OLS, confidence intervals constructed by Bootstrap LPR have on average 50% larger coverage probabilities. Bootstrap LPR also has on average 35% shorter confidence interval lengths than the de-sparsified Lasso methods, regardless of whether linear models are misspecified. Additionally, we provide theoretical guarantees of Bootstrap LPR under appropriate conditions and implement it in the R package "HDCI."
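The two-stage estimator and the residual-bootstrap percentile intervals described above can be sketched in Python as follows. This is a minimal illustration, not the authors' HDCI implementation: the penalty levels `lam1` and `lam2` are illustrative constants (the paper would use data-driven choices such as cross-validation), and "Partial Ridge" is rendered here as a ridge penalty applied only to the coefficients the Lasso did not select.

```python
import numpy as np
from sklearn.linear_model import Lasso

def lasso_partial_ridge(X, y, lam1=0.1, lam2=1e-3):
    """Two-stage LPR sketch: Lasso selects features, then Partial
    Ridge refits all coefficients, penalizing only the coordinates
    *outside* the Lasso-selected set. lam1/lam2 are illustrative."""
    n, p = X.shape
    # Stage 1: Lasso feature selection.
    sel = Lasso(alpha=lam1, fit_intercept=False).fit(X, y)
    S = np.abs(sel.coef_) > 0
    # Stage 2: Partial Ridge. D puts a ridge penalty on the
    # unselected coordinates only, so selected coefficients are
    # refit essentially without shrinkage.
    D = np.diag((~S).astype(float))
    beta = np.linalg.solve(X.T @ X / n + lam2 * D, X.T @ y / n)
    return beta

def bootstrap_lpr_ci(X, y, B=200, level=0.95, lam1=0.1, lam2=1e-3,
                     seed=0):
    """Residual-bootstrap percentile CIs built on the LPR estimator."""
    n, p = X.shape
    beta_hat = lasso_partial_ridge(X, y, lam1, lam2)
    resid = y - X @ beta_hat
    resid = resid - resid.mean()          # center residuals
    rng = np.random.default_rng(seed)
    boots = np.empty((B, p))
    for b in range(B):
        # Resample residuals, rebuild responses, re-estimate.
        y_star = X @ beta_hat + rng.choice(resid, size=n, replace=True)
        boots[b] = lasso_partial_ridge(X, y_star, lam1, lam2)
    alpha = 1.0 - level
    lo = np.quantile(boots, alpha / 2, axis=0)
    hi = np.quantile(boots, 1 - alpha / 2, axis=0)
    return lo, hi
```

Because the ridge penalty excludes the selected set, small but non-zero coefficients that the Lasso misses are still refit with only mild shrinkage, which is the mechanism the abstract credits for relaxing the beta-min assumption.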
